The difficult step in effecting forecasting process change (1 of 2)


Two weeks ago we looked at the first two steps in effecting forecasting process change:

  1. Justify your suspicions with data
  2. Communicate your findings

That was the easy part. So why is it that so many organizations realize they have a forecasting problem, yet are unable to do anything about it? A new case study by Fildes and Goodwin* (F&G) provides insight into how an inefficient forecasting process (with a forecasting support system at its heart) can persist for many years.

Research Method

Most forecasting research has been about creating new modeling methods, or evaluating the performance of existing ones. Initiatives like the M4 Forecasting Competition (and the current M5) have contributed greatly to our knowledge in this area.

The objective of the F&G study, however, was to understand how forecasters go about their organizational tasks when using modeling methods through a forecasting support system. They used a case study approach, including direct observation of the forecasting process along with interviews of participants. This allowed for a deep account of how managers use and interact with their forecasting systems, and provided a better understanding of the psychological and political aspects that motivate each individual's behavior.

The study began in 2004, with visits by the researchers to the subject company (a regional subsidiary in the pharmaceutical industry) for interviews and observations. The existing forecasting system was generally well regarded, and thought to provide an accuracy improvement over the prior approach (although there was no data to support that belief). Nevertheless, a Six Sigma project had been initiated on forecasting because of the amount of effort required to produce forecasts, and concern that accuracy could be better.

The Forecasting Process

Pharmaceutical products commonly follow distinctive life-cycle patterns. The forecasters often used their judgment of what the forecast "should look like" to override the system generated forecast. This became the baseline forecast for consideration in the monthly product group review meetings, where "market intelligence" (MI) was applied by product management, and discussion ensued until forecasts were jointly agreed upon.

An important question at this point, when applying the Forecast Value Added mindset, is whether these adjustments to the original statistical forecasts improved accuracy. F&G could investigate the effect of judgmental overrides only with great difficulty, because the original automatic statistical baseline forecasts were not recorded! (This is an extremely common and unfortunate oversight -- please don't repeat it at your organization!) So the best they could do was simulate what those original forecasts would have been, using software with a similar algorithm.
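For reference, here's a minimal sketch (in Python, with pandas) of what an FVA comparison looks like when the baseline is recorded. The data frame, column names, and the choice of MAPE as the accuracy metric are illustrative assumptions, not the actual F&G setup.

```python
# A minimal FVA (Forecast Value Added) check, assuming you have recorded, for
# each item/period, the system's statistical baseline, the final forecast after
# judgmental overrides, and the actuals. Column names and MAPE are assumptions.
import pandas as pd

def mape(actual, forecast):
    """Mean absolute percentage error, ignoring periods with zero actuals."""
    mask = actual != 0
    return (abs(actual[mask] - forecast[mask]) / actual[mask]).mean() * 100

# Hypothetical history: one row per product/month
history = pd.DataFrame({
    "actual":     [100, 120,  90, 110],
    "stat_fcst":  [105, 115,  95, 108],   # system-generated baseline (record this!)
    "final_fcst": [110, 118,  85, 112],   # after judgmental override / MI adjustment
})

stat_mape  = mape(history["actual"], history["stat_fcst"])
final_mape = mape(history["actual"], history["final_fcst"])

# Positive FVA means the overrides improved accuracy; negative means they hurt it.
print(f"Statistical MAPE: {stat_mape:.1f}%")
print(f"Final MAPE:       {final_mape:.1f}%")
print(f"FVA of overrides: {stat_mape - final_mape:+.1f} points")
```

The whole comparison is only possible because the statistical baseline was saved before anyone touched it, which is exactly the step the subject company skipped.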

Analysis suggested that the judgmentally adjusted baseline forecasts were slightly less accurate than the original statistical forecasts -- not much harm done, just a lot of wasted effort. Looking next at the MI adjustments to the baseline forecasts, moderate improvements were sometimes seen. However, just 51.3% of the MI adjustments improved accuracy (the most successful adjustments tended to be the larger ones), and fewer than 45% of the smallest adjustments improved accuracy.
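To illustrate the kind of stratified analysis this involves, here's a rough sketch that splits adjustments into size buckets and computes the share of each bucket that improved accuracy. The simulated data, bucket definitions, and column names are assumptions for illustration only, not the study's data or results.

```python
# Stratify adjustments by size and ask what fraction improved accuracy,
# in the spirit of the F&G analysis. All numbers here are simulated.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 500
actual   = rng.uniform(80, 120, n)
baseline = actual + rng.normal(0, 10, n)   # pre-adjustment (baseline) forecast
adjust   = rng.normal(0, 8, n)             # MI adjustment applied in the review meeting
final    = baseline + adjust

df = pd.DataFrame({
    "abs_adjust": np.abs(adjust),
    # An adjustment "improved" the forecast if the final error is smaller than the baseline error
    "improved": np.abs(actual - final) < np.abs(actual - baseline),
})

# Split adjustments into small / medium / large thirds and compare hit rates
df["size_bucket"] = pd.qcut(df["abs_adjust"], 3, labels=["small", "medium", "large"])
print(df.groupby("size_bucket", observed=True)["improved"].mean().round(3))
```

Run against real forecast history, a table like this makes it easy to see whether the small adjustments are earning their keep.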

A good takeaway from this, and similar findings in other studies, is that small adjustments are not worth the effort. Even if the adjustment is directionally correct and makes the forecast more accurate -- will there be any impact? A small adjustment makes, at best, a small improvement in accuracy. If nobody notices, and no better decisions or actions are taken, you've simply wasted time.

To be continued...

------------

Fildes, R. & Goodwin, P. (2020). Stability and innovation in the use of forecasting systems: a case study in a supply-chain company. Department of Management Science Working Paper 2020:1. Lancaster University.

 


About Author

Mike Gilliland

Product Marketing Manager

Michael Gilliland is a longtime business forecasting practitioner and formerly a Product Marketing Manager for SAS Forecasting. He is on the Board of Directors of the International Institute of Forecasters, and is Associate Editor of their practitioner journal Foresight: The International Journal of Applied Forecasting. Mike is author of The Business Forecasting Deal (Wiley, 2010) and former editor of the free e-book Forecasting with SAS: Special Collection (SAS Press, 2020). He is principal editor of Business Forecasting: Practical Problems and Solutions (Wiley, 2015) and Business Forecasting: The Emerging Role of Artificial Intelligence and Machine Learning (Wiley, 2021). In 2017 Mike received the Institute of Business Forecasting's Lifetime Achievement Award. In 2021 his paper "FVA: A Reality Check on Forecasting Practices" was inducted into the Foresight Hall of Fame. Mike initiated The Business Forecasting Deal blog in 2009 to help expose the seamy underbelly of forecasting practice, and to provide practical solutions to its most vexing problems.

